
    Crew workload strategies in advanced cockpits

    Many methods of measuring and predicting operator workload have been developed that provide useful information in the design, evaluation, and operation of complex systems and aid in developing models of human attention and performance. However, the relationships between such measures, imposed task demands, and measures of performance remain complex and even contradictory. It appears that we have ignored an important factor: people do not passively translate task demands into performance. Rather, they actively manage their time, resources, and effort to achieve an acceptable level of performance while maintaining a comfortable level of workload. While such adaptive, creative, and strategic behaviors are the primary reason that human operators remain an essential component of all advanced man-machine systems, they also result in individual differences in the way people respond to the same task demands and in inconsistent relationships among measures. Finally, we are able to measure workload and performance, but interpreting such measures remains difficult; it is still not clear how much workload is too much or too little, nor what consequences suboptimal workload has for system performance and for the mental, physical, and emotional well-being of human operators. The rationale and philosophy of a program of research developed to address these issues are reviewed and contrasted with traditional methods of defining, measuring, and predicting human operator workload. Viewgraphs are given.

    The use of visual cues for vehicle control and navigation

    At least three levels of control are required to operate most vehicles: (1) inner-loop control to counteract the momentary effects of disturbances on vehicle position; (2) intermittent maneuvers to avoid obstacles; and (3) outer-loop control to maintain a planned route. Operators monitor dynamic optical relationships in their immediate surroundings to estimate momentary changes in forward, lateral, and vertical position, rates of change in speed and direction of motion, and distance from obstacles. The process of searching the external scene to find landmarks (for navigation) is intermittent and deliberate, while monitoring and responding to subtle changes in the visual scene (for vehicle control) is relatively continuous and 'automatic'. However, since operators may perform both tasks simultaneously, the dynamic optical cues available for a vehicle control task may be determined by the operator's direction of gaze for wayfinding. An attempt to relate the visual processes involved in vehicle control and wayfinding is presented. The frames of reference and information used by different operators (e.g., automobile drivers, airline pilots, and helicopter pilots) are reviewed with particular emphasis on the special problems encountered by helicopter pilots flying nap of the earth (NOE). The goal of this overview is to describe the context within which different vehicle control tasks are performed and to suggest ways in which the use of visual cues for geographical orientation might influence visually guided control activities.

    Helicopter human factors research

    Helicopter flight is among the most demanding of all human-machine integrations. The inherent manual control complexities of rotorcraft are made even more challenging by the small margin for error created in certain operations, such as nap-of-the-Earth (NOE) flight, by the proximity of the terrain. Accident data recount numerous examples of unintended conflict between helicopters and terrain and attest to the perceptual and control difficulties associated with low altitude flight tasks. Ames Research Center, in cooperation with the U.S. Army Aeroflightdynamics Directorate, has initiated an ambitious research program aimed at increasing safety margins for both civilian and military rotorcraft operations. The program is broad, fundamental, and focused on the development of scientific understanding and technological countermeasures. Research being conducted in several areas is reviewed: workload assessment, prediction, and measure validation; development of advanced displays and effective pilot/automation interfaces; identification of visual cues necessary for low-level, low-visibility flight and modeling of visual flight-path control; and pilot training.

    Helmet-mounted pilot night vision systems: Human factors issues

    Helmet-mounted displays of infrared imagery (forward-looking infrared (FLIR)) allow helicopter pilots to perform low level missions at night and in low visibility. However, pilots experience high visual and cognitive workload during these missions, and their performance capabilities may be reduced. Human factors problems inherent in existing systems stem from three primary sources: the nature of thermal imagery; the characteristics of specific FLIR systems; and the difficulty of using FLIR systems for flying and/or visually acquiring and tracking objects in the environment. The pilot night vision system (PNVS) in the Apache AH-64 provides a monochrome, 30 by 40 deg helmet-mounted display of infrared imagery. Thermal imagery is inferior to television imagery in both resolution and contrast ratio. Gray shades represent temperature differences rather than brightness variability, and images undergo significant changes over time. The limited field of view, displacement of the sensor from the pilot's eye position, and monocular presentation of a bright FLIR image (while the other eye remains dark-adapted) are all potential sources of disorientation, limitations in depth and distance estimation, sensations of apparent motion, and difficulties in target and obstacle detection. Insufficient information about human perceptual and performance limitations limits the ability of human factors specialists to provide significantly improved specifications, training programs, or alternative designs. Additional research is required to determine the most critical problem areas and to propose solutions that consider the human operator as well as the development of technology.

    Workshop on Workload and Training, and Examination of their Interactions: Executive summary

    The goal of the workshop was to bring together experts in the fields of workload and training and representatives from the Dept. of Defense and industrial organizations who are responsible for specifying, building, and managing advanced, complex systems. The challenging environments and requirements imposed by military helicopter missions and space station operations were presented as the focus for the panel discussions. The workshop permitted a detailed examination of the theoretical foundations of the fields of training and workload, as well as their practical applications. Furthermore, it created a forum where government, industry, and academic experts were able to examine each other's concepts, values, and goals. The discussions pointed out the necessity for a more efficient and effective flow of information among the groups represented. The executive summary describes the rationale of the meeting, summarizes the primary points of discussion, and lists the participants and some of their summary comments.

    Research papers and publications (1981-1987): Workload research program

    An annotated bibliography of the research reports written by participants in NASA's Workload Research Program since 1981 is presented, representing the results of theoretical and applied research conducted at Ames Research Center and at universities and industrial laboratories funded by the program. The major program elements included: 1) developing an understanding of the workload concept; 2) providing valid, reliable, and practical measures of workload; and 3) creating a computer model to predict workload. The goal is to provide workload-related design principles, measures, guidelines, and computational models. The research results are transferred to user groups by establishing close ties with manufacturers, civil and military operators of aerospace systems, and regulatory agencies; publishing scientific articles; participating in and sponsoring workshops and symposia; providing information, guidelines, and computer models; and contributing to the formulation of standards. In addition, the methods and theories developed have been applied to specific operational and design problems at the request of a number of industry and government agencies.

    An integrated approach to rotorcraft human factors research

    As the potential of civil and military helicopters has increased, more complex and demanding missions in increasingly hostile environments have been required. Users, designers, and manufacturers have an urgent need for information about human behavior and function to create systems that take advantage of human capabilities without overloading them. Because there is a large gap between what is known about human behavior and the information needed to predict pilot workload and performance in the complex missions projected for pilots of advanced helicopters, Army and NASA scientists are actively engaged in human factors research at Ames. The research ranges from laboratory experiments to computational modeling, simulation evaluation, and in-flight testing. Information obtained in highly controlled but simpler environments generates predictions which can be tested in more realistic situations. These results are used, in turn, to refine theoretical models, provide the focus for subsequent research, and ensure operational relevance, while maintaining predictive advantages. The advantages and disadvantages of each type of research are described along with examples of experimental results.

    GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits

    An increasing number of domains, including aeronautics, are adopting touchscreens. However, several drawbacks limit their operational use; in particular, eyes-free interaction is almost impossible, making it difficult to perform other tasks simultaneously. We introduce GazeForm, an adaptive touch interface with shape-changing capacity that offers an adapted interaction modality according to gaze direction. When the user’s eyes are focused on interaction, the surface is flat and the system acts as a touchscreen. When the eyes are directed towards another area, physical knobs emerge from the surface. Compared to a touch-only mode, experimental results showed that GazeForm generated a lower subjective mental workload and a higher efficiency of execution (20% faster). Furthermore, GazeForm required less visual attention, and participants were able to concentrate more on a secondary monitoring task. Complementary interviews with pilots led us to explore timings and levels of control for using gaze to adapt modality.

    Augmented Reality for People with Visual Impairments: Designing and Creating Audio-Tactile Content from Existing Objects

    ISBN: 978-3-319-94273-5. Tactile maps and diagrams are widely used as accessible graphical media for people with visual impairments, in particular in the context of education. They can be made interactive by augmenting them with audio feedback. It is, however, complicated to create audio-tactile graphics that have rich and realistic tactile textures. To overcome these limitations, we propose a new augmented reality approach allowing novices to easily and quickly augment real objects with audio feedback. In our user study, six teachers created their own audio-augmentation of objects, such as a botanical atlas, within 30 minutes or less. Teachers found the tool easy to use and were confident about re-using it. The resulting augmented objects allow two modes: exploration mode provides feedback on demand about an element, while quiz mode provides questions and answers. We evaluated the resulting audio-tactile material with five visually impaired children. Participants found the resulting interactive graphics exciting to use, independently of their mental imagery skills.

    Levitation Simulator: Prototyping Ultrasonic Levitation Interfaces in Virtual Reality

    We present the Levitation Simulator, a system that enables researchers and designers to iteratively develop and prototype levitation interface ideas in Virtual Reality, including user tests and formal experiments. We derive a model of the movement of a levitating particle in such an interface. Based on this, we develop an interactive simulation of the levitation interface in VR, which exhibits the dynamical properties of the real interface. The results of a Fitts' Law pointing study show that the Levitation Simulator enables performance comparable to that of the real prototype. We developed the first two interactive games dedicated to levitation interfaces, LeviShooter and BeadBounce, in the Levitation Simulator, and then implemented them on the real interface. Our results indicate that participants experienced similar levels of user engagement when playing the games in the two environments. We share our Levitation Simulator as open source, thereby democratizing levitation research without the need for a levitation apparatus.
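    The Fitts' Law study mentioned above compares pointing performance across the two environments. As background, Fitts' law predicts movement time from target distance and width; a minimal sketch of the Shannon formulation follows, where the regression coefficients `a` and `b` are illustrative placeholders, not values from the paper:

    ```python
    import math

    def index_of_difficulty(distance, width):
        """Shannon formulation of Fitts' index of difficulty, in bits."""
        return math.log2(distance / width + 1)

    def predicted_movement_time(distance, width, a=0.1, b=0.2):
        """Predicted movement time MT = a + b * ID.

        a (intercept, seconds) and b (seconds per bit) are fitted per device
        and condition; the defaults here are purely illustrative.
        """
        return a + b * index_of_difficulty(distance, width)

    # A farther or smaller target has a higher index of difficulty,
    # and therefore a longer predicted movement time.
    id_near = index_of_difficulty(100, 50)  # log2(3)  ≈ 1.585 bits
    id_far = index_of_difficulty(400, 25)   # log2(17) ≈ 4.087 bits
    ```

    Comparing fitted `a` and `b` (or throughput, ID divided by movement time) between a simulated and a real interface is one standard way to argue the two afford comparable pointing performance.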